    ‘Turnaround’ of Indian Railways: Increasing the Axle Loading

    Increased axle loading contributed significantly to the ‘turnaround’ of the Indian Railways (IR) during 2004-06. As the Minister of Railways (MR) stated, “A one ton extra loading per wagon implied additional revenue of Rs 500 crore per annum for IR.” The axle loading initiative was a significant step by IR, though its sustainability was a concern. This paper focuses on the key driving events, process issues, impact and implications, and sustainability of the initiative of taking the load per wagon from its rated carrying capacity (CC) to CC+8, i.e., eight tons above CC. The axle load for a wagon had traditionally been 20.32 tons, except for the mainline versions of steam locomotives. In the early 1980s, the then Chairman of the Railway Board took the initiative of increasing axle loading on an experimental basis, which, after his tenure, could not be sustained on grounds of safety. In the late 1990s, there were initiatives to regularize the two-ton slack normally permitted as excess loading for certain commodities, usually on short hauls. During inspections in 2004, the railway minister noticed significant overloading of many wagons on the iron ore and coal routes. This set him thinking about the axle loading initiative. When one of the Zonal Railways (ZR) proposed an increase of up to ten tons per four-axle wagon, various directorates in the Railway Board (RB) gave their views, many of which opposed the initiative. The minister, through the RB, directed a variety of processes to bring about inter-departmental alignment, and the initiative was taken forward step by step over the two years across a large part of IR. The safety and research institutions of IR also had to be brought on board. The initiative is still treated as an ‘experiment,’ with many issues that need resolution and strategizing.

    A Novel Optical/Digital Processing System for Pattern Recognition

    This paper describes two processing algorithms that can be implemented optically: the Radon transform and angular correlation. These two algorithms can be combined in one optical processor to extract all the basic geometric and amplitude features from objects embedded in video imagery. We show that the internal amplitude structure of objects is recovered by the Radon transform, which is a well-known result; in addition, we show simulation results for angular correlation, a simple but unique algorithm that extracts object boundaries from suitably thresholded images, from which length, width, area, aspect ratio, and orientation can be derived. The simulations also indicate that, in addition to circumventing scale and rotation distortions, the features derived from the angular correlation algorithm are relatively insensitive to tracking shifts and image noise. Some optical architecture concepts, including one based on micro-optical lenslet arrays, have been developed to implement these algorithms. Simulation tests and evaluation using simple synthetic object data are described, including results of a study that uses object boundaries (derivable from angular correlation) to classify simple objects using a neural network.
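    As a rough software illustration of the two algorithms (the paper implements them optically), the sketch below uses scikit-image's radon for the transform; the angular-correlation step is a reconstruction from the abstract's description via a centroid-based boundary signature, so the helper names and the binning scheme are assumptions, not the authors' method:

        import numpy as np
        from skimage.transform import radon

        def radon_features(image, angles=np.arange(0.0, 180.0)):
            # Line integrals of the image at each projection angle; the
            # resulting sinogram encodes the object's internal amplitude
            # structure, as the abstract notes.
            return radon(image, theta=angles)

        def boundary_signature(mask, n_bins=180):
            # Distance from the centroid to the outermost object pixel in
            # each angular bin: an illustrative boundary signature r(theta)
            # for a thresholded (binary) object mask.
            ys, xs = np.nonzero(mask)
            cy, cx = ys.mean(), xs.mean()
            theta = np.arctan2(ys - cy, xs - cx)
            r = np.hypot(ys - cy, xs - cx)
            edges = np.linspace(-np.pi, np.pi, n_bins + 1)
            idx = np.clip(np.digitize(theta, edges) - 1, 0, n_bins - 1)
            sig = np.zeros(n_bins)
            np.maximum.at(sig, idx, r)
            return sig

        def angular_correlation(sig):
            # Circular autocorrelation of the signature over angle, via FFT.
            # Rotating the object only shifts the signature, so the
            # correlation is rotation-tolerant; length, width, aspect ratio,
            # and orientation can be estimated from its peaks.
            f = np.fft.rfft(sig)
            return np.fft.irfft(f * np.conj(f), n=sig.size)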

    Expressing the Behavior of Three Very Different Concurrent Systems by Using Natural Extensions of Separation Logic

    Separation Logic is a non-classical logic used to verify pointer-intensive code. In this paper, however, we show that Separation Logic, along with its natural extensions, can also be used as a specification language for concurrent-system design. To do so, we express the behavior of three very different concurrent systems: a Subway, a Stopwatch, and a 2x2 Switch. The Subway is originally implemented in LUSTRE, the Stopwatch in Esterel, and the 2x2 Switch in Bluespec.
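    For readers unfamiliar with the notation, a minimal illustration of standard Separation Logic (not this paper's extensions): the separating conjunction * asserts that two resources occupy disjoint state, and the frame rule lets a local specification be reused in a larger context:

        x \mapsto v \,*\, y \mapsto w
        \qquad \text{(two \emph{disjoint} heap cells, holding $v$ and $w$)}

        \frac{\{P\}\; C\; \{Q\}}{\{P \,*\, R\}\; C\; \{Q \,*\, R\}}
        \qquad \text{(frame rule; $C$ must not modify variables free in $R$)}

    It is this disjointness reading of * that makes the logic a natural fit for concurrent components that own separate parts of the state.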

    Evaluating the reliability of NAND multiplexing with PRISM

    Probabilistic model checking is a formal verification technique for analyzing the reliability and performance of systems exhibiting stochastic behavior. In this paper, we demonstrate the applicability of this approach, and in particular of the probabilistic model checker PRISM, to the evaluation of reliability and redundancy of defect-tolerant systems in the field of computer-aided design. We illustrate the technique with an example due to von Neumann, namely NAND multiplexing. We show how, having constructed a model of a defect-tolerant system incorporating probabilistic assumptions about its defects, it is straightforward to compute a range of reliability measures and to investigate how they are affected by slight variations in the behavior of the system. This allows a designer to evaluate, for example, the tradeoff between redundancy and reliability in the design. We also highlight errors in analytically computed reliability bounds recently published for the same case study.
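    The paper analyzes the construction as a Markov chain in PRISM's modeling language; purely as an illustration of von Neumann's construction itself, here is a Monte Carlo sketch in Python (the bundle size, gate-failure probability, and trial count are assumed, illustrative values, not the paper's):

        import random

        def nand_stage(x, y, eps, rng):
            # One multiplexing stage: randomly pair wires from the two input
            # bundles and pass each pair through a NAND gate whose output is
            # inverted (von Neumann's fault model) with probability eps.
            rng.shuffle(x)
            rng.shuffle(y)
            out = []
            for a, b in zip(x, y):
                z = not (a and b)
                if rng.random() < eps:
                    z = not z  # faulty gate: output flipped
                out.append(z)
            return out

        def multiplex_reliability(n_wires=20, eps=0.01, inputs=(True, True),
                                  trials=2000, seed=1):
            # Executive stage computes NAND of the inputs; two restorative
            # stages (each computing NAND(z, z) == NOT z) damp the error rate.
            # Returns the estimated probability that a majority of the output
            # bundle carries the correct value.
            rng = random.Random(seed)
            ideal = not (inputs[0] and inputs[1])
            ok = 0
            for _ in range(trials):
                out = nand_stage([inputs[0]] * n_wires,
                                 [inputs[1]] * n_wires, eps, rng)
                for _ in range(2):  # restorative stages come in pairs
                    out = nand_stage(out[:], out[:], eps, rng)
                if sum(w == ideal for w in out) > n_wires // 2:
                    ok += 1
            return ok / trials

        print(multiplex_reliability())          # high reliability at eps = 0.01
        print(multiplex_reliability(eps=0.2))   # degrades as gates get noisier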

    Attractive Potential around a Thermionically Emitting Microparticle

    We present a simulation study of the charging of a dust grain immersed in a plasma, considering the effect of electron emission from the grain (the thermionic effect). It is shown that orbital-motion-limited (OML) theory is no longer reliable when electron emission becomes large: screening can no longer be treated within the Debye-Hückel approach, and an attractive potential well forms, leading to the possibility of attractive forces on other grains of the same polarity. We suggest performing laboratory experiments in which emitting dust grains could be used to create non-conventional dust crystals or macro-molecules.
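    For reference, the standard Debye-Hückel (Yukawa) screened potential around a grain of charge q_d, which the abstract reports breaking down under strong emission, is

        \phi_{\mathrm{DH}}(r) = \frac{q_d}{4\pi\varepsilon_0\, r}\; e^{-r/\lambda_D}

    This profile is monotonic in r, so like-charged grains always repel under it; the reported potential well implies a non-monotonic profile, which is what permits attraction between grains of the same polarity.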